Non-Asymptotic and Asymptotic Analyses of Information Processing on Markov Chains

Authors

  • Masahito Hayashi
  • Shun Watanabe
Abstract

We study finite-length bounds for source coding with side information for Markov sources, and for channel coding for channels with conditional Markovian additive noise. For this purpose, we propose two criteria for finite-length bounds: one is asymptotic optimality and the other is efficient computability of the bound. We then derive finite-length upper and lower bounds on the coding length in both settings so that they can be computed efficiently. To address the first criterion, we derive large deviation bounds, moderate deviation bounds, and second-order bounds for these two topics, and show that the finite-length bounds achieve asymptotic optimality in these senses. For this discussion, we introduce several kinds of information measures for transition matrices.

Index Terms: Channel Coding, Markov Chain, Finite-Length Analysis, Source Coding
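As a rough illustration of two of the quantities the abstract refers to, the sketch below (my own, not the authors' method) computes the entropy rate of a Markov source from its transition matrix and plugs a simulation-based dispersion estimate into the usual second-order Gaussian approximation nH + sqrt(nV) Φ⁻¹(ε) for the optimal coding length. The transition matrix, block length, and error probability are hypothetical.

```python
# Minimal sketch (not the paper's algorithm): entropy rate of a Markov source
# and a second-order (Gaussian) approximation to the optimal coding length,
# with the dispersion V estimated by simulation rather than in closed form.
import numpy as np
from scipy.stats import norm

def stationary_distribution(P):
    """Stationary distribution pi with pi P = pi (assumes an irreducible chain)."""
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    return pi / pi.sum()

def entropy_rate(P):
    """Entropy rate H = -sum_x pi(x) sum_y P(y|x) log P(y|x), in nats."""
    pi = stationary_distribution(P)
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    return -np.sum(pi[:, None] * P * logP)

def estimated_dispersion(P, n=2000, trials=400, seed=None):
    """Monte-Carlo estimate of V = lim (1/n) Var(-log P(X_1..X_n | X_0))."""
    rng = np.random.default_rng(seed)
    pi = stationary_distribution(P)
    k = P.shape[0]
    totals = np.empty(trials)
    for t in range(trials):
        x, s = rng.choice(k, p=pi), 0.0
        for _ in range(n):
            y = rng.choice(k, p=P[x])
            s -= np.log(P[x, y])
            x = y
        totals[t] = s
    return totals.var() / n

if __name__ == "__main__":
    # Example binary Markov source (hypothetical parameters).
    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    n, eps = 10_000, 1e-3
    H = entropy_rate(P)
    V = estimated_dispersion(P, seed=0)
    approx_len = n * H + np.sqrt(n * V) * norm.ppf(eps)
    print(f"entropy rate H = {H:.4f} nats/symbol")
    print(f"estimated dispersion V = {V:.4f}")
    print(f"second-order approximation to optimal length: {approx_len:.1f} nats")
```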


Related articles

Perturbation Analysis in Verification of Discrete-Time Markov Chains

Perturbation analysis in probabilistic verification addresses the robustness and sensitivity problem for verification of stochastic models against qualitative and quantitative properties. We identify two types of perturbation bounds, namely non-asymptotic bounds and asymptotic bounds. Non-asymptotic bounds are exact, pointwise bounds that quantify the upper and lower bounds of the verification ...
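As a rough illustration of the distinction drawn above (my own sketch under simplified assumptions, not the paper's method), the following compares the exact, pointwise change of a reachability probability in a toy parametric DTMC with the first-order estimate obtained from the derivative at the nominal parameter; the chain and its parameters are hypothetical.

```python
# Hedged sketch: exact ("non-asymptotic") versus first-order ("asymptotic")
# perturbation estimates of a reachability probability in a toy DTMC.
import numpy as np

def reach_probability(p, q, delta):
    """P(eventually reach 'goal') from state 0 in a 3-state chain where, from
    state 0, we move to 'goal' w.p. p+delta, to 'fail' w.p. q, and stay
    otherwise; 'goal' and 'fail' are absorbing."""
    stay = 1.0 - (p + delta) - q
    # Solve x = (p + delta) + stay * x for the single transient state.
    return (p + delta) / (1.0 - stay)

if __name__ == "__main__":
    p, q = 0.3, 0.2                                   # hypothetical nominal values
    h = 1e-6
    f0 = reach_probability(p, q, 0.0)
    dfd = (reach_probability(p, q, h) - f0) / h       # numerical derivative at 0
    for delta in (0.01, 0.05, 0.1):
        exact = reach_probability(p, q, delta) - f0   # exact pointwise change
        linear = dfd * delta                          # first-order estimate
        print(f"delta={delta:.2f}  exact change={exact:+.4f}  "
              f"first-order estimate={linear:+.4f}")
```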

Best proximity point theorems in Hadamard spaces using relatively asymptotic center

In this article we survey the existence of best proximity points for a class of non-self mappings which satisfy a particular nonexpansiveness condition. In this way, we improve and extend a main result of Abkar and Gabeleh [A. Abkar, M. Gabeleh, Best proximity points of non-self mappings, Top, 21, (2013), 287-295] which guarantees the existence of best proximity points for nonex...

Improving Asymptotic Variance of MCMC Estimators: Non-reversible Chains are Better

I show how any reversible Markov chain on a finite state space that is irreducible, and hence suitable for estimating expectations with respect to its invariant distribution, can be used to construct a non-reversible Markov chain on a related state space that can also be used to estimate these expectations, with asymptotic variance at least as small as that using the reversible chain (typically...
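As a loose illustration of this idea (a hedged sketch in the same spirit, not the construction from that paper), the code below compares simulation-based asymptotic-variance estimates for a reversible random walk on a cycle and for a simple "lifted" non-reversible walk that carries a direction variable but keeps the same uniform stationary distribution on the cycle; the state-space size and run lengths are hypothetical.

```python
# Hedged sketch: asymptotic variance of an MCMC estimator for a reversible
# random walk on Z_n versus a lifted non-reversible walk with the same
# uniform stationary distribution on the cycle.
import numpy as np

def run_reversible(n, T, rng):
    """Simple random walk on Z_n (n odd, so the chain is aperiodic)."""
    x, f_sum = rng.integers(n), 0.0
    for _ in range(T):
        x = (x + rng.choice((-1, 1))) % n
        f_sum += np.cos(2 * np.pi * x / n)
    return f_sum / T

def run_lifted(n, T, rng, flip=0.1):
    """Non-reversible lifted walk: keep a direction d, reverse it w.p. `flip`;
    the uniform distribution on the cycle is still stationary."""
    x, d, f_sum = rng.integers(n), rng.choice((-1, 1)), 0.0
    for _ in range(T):
        if rng.random() < flip:
            d = -d
        else:
            x = (x + d) % n
        f_sum += np.cos(2 * np.pi * x / n)
    return f_sum / T

if __name__ == "__main__":
    n, T, reps = 31, 20_000, 200          # hypothetical experiment sizes
    rng = np.random.default_rng(0)
    rev = [run_reversible(n, T, rng) for _ in range(reps)]
    lif = [run_lifted(n, T, rng) for _ in range(reps)]
    # Asymptotic variance ~ T * Var(sample mean) over independent replications.
    print("reversible      asymptotic variance ~", T * np.var(rev))
    print("non-reversible  asymptotic variance ~", T * np.var(lif))
```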

Risk-sensitive probability for Markov chains

The probability distribution of a Markov chain is viewed as the information state of an additive optimization problem. This optimization problem is then generalized to a product form whose information state gives rise to a generalized notion of probability distribution for Markov chains. The evolution and the asymptotic behavior of this generalized or “risk-sensitive” probability distribution i...


Publication date: 2013